Improved neuronal ensemble inference with generative model and MCMC

Authors

Abstract

Neuronal ensemble inference is a significant problem in the study of biological neural networks. Various methods have been proposed for extracting neuronal ensembles from experimental data of neuronal activity. Among them, a Bayesian approach with a generative model was proposed recently. However, this method requires a large computational cost for appropriate inference. In this work, we give an improved algorithm by modifying the update rule of the Markov chain Monte Carlo sampler and introducing the idea of simulated annealing for hyperparameter control. We compare the performance of our algorithm with that of the original one, and discuss the advantage of our method.
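The two ingredients named in the abstract, a Metropolis-style MCMC update and a simulated-annealing temperature schedule, can be illustrated with a generic sketch. This is not the paper's algorithm; the function name, the toy target distribution, and all parameter values below are invented for illustration.

```python
import math
import random

def metropolis_annealed(log_target, x0, n_steps=2000, step=0.5,
                        beta0=0.2, beta1=1.0, seed=0):
    """Metropolis sampling with an inverse-temperature (annealing) schedule.

    The chain targets exp(beta * log_target(x)); beta is ramped linearly
    from beta0 to beta1, so early steps explore broadly and late steps
    concentrate on high-probability regions.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for t in range(n_steps):
        beta = beta0 + (beta1 - beta0) * t / (n_steps - 1)
        prop = x + rng.gauss(0.0, step)
        # Accept with probability min(1, exp(beta * (logp(prop) - logp(x)))).
        if math.log(rng.random() + 1e-300) < beta * (log_target(prop) - log_target(x)):
            x = prop
        samples.append(x)
    return samples

# Toy target: standard normal, log density up to an additive constant.
samples = metropolis_annealed(lambda x: -0.5 * x * x, x0=5.0)
mean = sum(samples[-500:]) / 500.0  # late samples concentrate near the mode at 0
```

In an ensemble-inference setting the continuous state above would be replaced by discrete ensemble assignments, but the accept/reject structure and the annealing of beta are the same idea.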



Similar articles

Approximate Inference with Amortised MCMC

We propose a novel approximate inference framework that approximates a target distribution by amortising the dynamics of a user-selected Markov chain Monte Carlo (MCMC) sampler. The idea is to initialise MCMC using samples from an approximation network, apply the MCMC operator to improve these samples, and finally use the samples to update the approximation network thereby improving its quality...
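The loop described in this abstract (sample from an approximation, improve with MCMC, update the approximation toward the improved samples) can be sketched in miniature. As a labeled simplification, the "approximation network" here is reduced to a single Gaussian updated by moment matching; the target and all constants are invented for illustration.

```python
import math
import random

rng = random.Random(1)

def log_target(x):
    # Toy target: N(3, 1), log density up to an additive constant.
    return -0.5 * (x - 3.0) ** 2

# "Approximation network" reduced to a Gaussian with learnable mean/std.
mu, sigma = 0.0, 2.0

for it in range(200):
    # 1) Initialise MCMC with samples from the current approximation.
    xs = [rng.gauss(mu, sigma) for _ in range(32)]
    # 2) Improve each sample with a few Metropolis steps on the target.
    for _ in range(5):
        for i, x in enumerate(xs):
            prop = x + rng.gauss(0.0, 0.5)
            if math.log(rng.random() + 1e-300) < log_target(prop) - log_target(x):
                xs[i] = prop
    # 3) Update the approximation toward the improved samples (moment matching
    #    here; the paper uses a learned network instead).
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    mu += 0.1 * (m - mu)
    sigma += 0.1 * (math.sqrt(v) - sigma)
```

After the loop, `mu` has drifted from 0 toward the target mean 3, showing how the MCMC-improved samples pull the approximation toward the target.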


Efficient Bayesian inference for stochastic volatility models with ensemble MCMC methods

In this paper, we introduce efficient ensemble Markov Chain Monte Carlo (MCMC) sampling methods for Bayesian computations in the univariate stochastic volatility model. We compare the performance of our ensemble MCMC methods with an improved version of a recent sampler of Kastner and Frühwirth-Schnatter (2014). We show that ensemble samplers are more efficient than this state-of-the-art sampler ...


PixelSNAIL: An Improved Autoregressive Generative Model

Autoregressive generative models consistently achieve the best results in density estimation tasks involving high dimensional data, such as images or audio. They pose density estimation as a sequence modeling task, where a recurrent neural network (RNN) models the conditional distribution over the next element conditioned on all previous elements. In this paradigm, the bottleneck is the extent ...


Learning Deep Generative Models with Doubly Stochastic MCMC

We present doubly stochastic gradient MCMC, a simple and generic method for (approximate) Bayesian inference of deep generative models in the collapsed continuous parameter space. At each MCMC sampling step, the algorithm randomly draws a minibatch of data samples to estimate the gradient of log-posterior and further estimates the intractable expectation over latent variables via a Gibbs sample...


Celeste: Variational inference for a generative model of astronomical images

We present a new, fully generative model of optical telescope image sets, along with a variational procedure for inference. Each pixel intensity is treated as a Poisson random variable, with a rate parameter dependent on latent properties of stars and galaxies. Key latent properties are themselves random, with scientific prior distributions constructed from large ancillary data sets. We check o...
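The per-pixel likelihood described here, each pixel count an independent Poisson variable with its own rate, can be written down directly. This is a minimal sketch of that one ingredient; the full Celeste model ties the rates to latent star and galaxy properties, which is omitted here.

```python
import math

def poisson_loglik(counts, rates):
    """Log-likelihood of observed pixel counts under independent Poisson
    pixels: sum over pixels of k*log(lam) - lam - log(k!)."""
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1)
               for k, lam in zip(counts, rates))
```

Inference then amounts to adjusting the latent properties (and hence the rates) to raise this log-likelihood, combined with the scientific priors the abstract mentions.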



Journal

Journal title: Journal of Statistical Mechanics: Theory and Experiment

Year: 2021

ISSN: 1742-5468

DOI: https://doi.org/10.1088/1742-5468/abffd5